# Small Parameter Size

## T5 Small Text Summarization
- License: Apache-2.0
- Description: A text summarization model based on the T5-small architecture and fine-tuned on the XSum dataset, capable of generating concise summaries.
- Tags: Text Generation, Transformers
- Author: bhuvaneswari
- Downloads: 27 · Likes: 0
## Japanese Parler Tts Mini Bate
- License: Other
- Description: Parler-TTS Mini v1 is a small Japanese text-to-speech model that supports high-quality speech synthesis.
- Tags: Speech Synthesis, Transformers, Japanese
- Author: 2121-8
- Downloads: 184 · Likes: 12
## Llama Lite 134m
- License: Apache-2.0
- Description: A 134M-parameter Llama Lite model with 768-dimensional hidden states, used to generate sentence embeddings.
- Tags: Text Embedding, Transformers
- Author: skeskinen
- Downloads: 93 · Likes: 13
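Sentence-embedding models of this kind typically pool per-token hidden states into a single vector per sentence. A minimal sketch of masked mean pooling in numpy (the function name and shapes are illustrative, not taken from this model's repository):

```python
import numpy as np

def mean_pool(hidden, mask):
    """Average token states into one sentence vector, ignoring padding.

    hidden: (seq_len, dim) token hidden states
    mask:   (seq_len,) 1 for real tokens, 0 for padding
    """
    m = mask[:, None].astype(hidden.dtype)
    return (hidden * m).sum(axis=0) / np.clip(m.sum(), 1e-9, None)
```

Masking before averaging keeps padding tokens from diluting the sentence vector.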
## Electra Contrastdata Squad
- License: Apache-2.0
- Description: A fine-tuned version of the ELECTRA-small discriminator on the SQuAD dataset, suitable for question-answering tasks.
- Tags: Question Answering, Transformers
- Author: mlxen
- Downloads: 19 · Likes: 0
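Extractive QA models trained on SQuAD score each token as a candidate answer start and end; the answer is the span with the best combined score. A minimal sketch of that span decoding (the logits in the example are made up for illustration):

```python
import numpy as np

def best_span(start_logits, end_logits, max_answer_len=30):
    """Pick the (start, end) pair with the highest combined score,
    requiring end >= start and a bounded answer length."""
    best_score, best = -np.inf, (0, 0)
    for i, s in enumerate(start_logits):
        for j in range(i, min(i + max_answer_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best
```

The `end >= start` and length constraints rule out degenerate spans that raw argmax over start and end independently can produce.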
## Mt5 Small Finetuned Amazon En Zh TW
- License: Apache-2.0
- Description: A text summarization model based on google/mt5-small and fine-tuned on the Amazon dataset, supporting English-to-Traditional-Chinese summary generation.
- Tags: Text Generation, Transformers
- Author: peterhsu
- Downloads: 28 · Likes: 0
## Tapas Small Masklm
- Description: TAPAS (Table Parser) is a table-based pre-trained language model developed by Google Research, designed for processing tabular data and natural language queries.
- Tags: Large Language Model, Transformers
- Author: google
- Downloads: 14 · Likes: 1
## Roformer Chinese Small
- Description: RoFormer is a Transformer model enhanced with Rotary Position Embedding (RoPE), suitable for Chinese text-processing tasks.
- Tags: Large Language Model, Chinese
- Author: junnyu
- Downloads: 599 · Likes: 2
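The rotary position embedding (RoPE) behind RoFormer rotates each consecutive pair of query/key dimensions by a position-dependent angle, so attention dot products depend only on relative position. A minimal numpy sketch (the `offset` argument is added here only to demonstrate the relative-position property, it is not part of the published formulation):

```python
import numpy as np

def rope(x, offset=0, base=10000.0):
    """Apply rotary position embedding to x of shape (seq_len, dim).

    Each pair (x[:, 2i], x[:, 2i+1]) is rotated by the angle
    (position + offset) * base**(-2i/dim).
    """
    seq_len, dim = x.shape
    freqs = base ** (-np.arange(dim // 2) * 2.0 / dim)
    angles = np.outer(np.arange(seq_len) + offset, freqs)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because the rotation angle is linear in position, `rope(q)[m] · rope(k)[n]` depends only on `n - m`, which is what lets attention scores encode relative position without a separate bias term.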
## Deberta V3 Small Finetuned Cola
- License: MIT
- Description: A fine-tuned version of DeBERTa-v3-small on the GLUE CoLA dataset for linguistic acceptability judgment.
- Tags: Text Classification, Transformers, English
- Author: mrm8488
- Downloads: 16 · Likes: 3
© 2025 AIbase